δ-subgaussian Random Variables in Cryptography
In the Ring-LWE literature, several works use a statistical framework based on δ-subgaussian random variables. These were introduced by Micciancio and Peikert (Eurocrypt 2012) as a relaxation of subgaussian random variables. In this paper, we completely characterise δ-subgaussian random variables. In particular, we show that this relaxation of a subgaussian random variable corresponds only to a shifting of the mean. Next, we give an alternative noncentral formulation of a δ-subgaussian random variable, which we argue is more statistically natural. This formulation enables us to extend prior results on sums of δ-subgaussian random variables and on their discretisation.
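For context, a formulation commonly attributed to Micciancio and Peikert is given below; normalisations vary across the literature, so this should be read as an illustrative definition rather than the one fixed in this paper.

```latex
% A real random variable X is \delta-subgaussian with parameter s > 0 if its
% scaled moment generating function satisfies, for all t \in \mathbb{R},
\mathbb{E}\!\left[\exp(2\pi t X)\right] \;\le\; e^{\delta}\,\exp\!\left(\pi s^{2} t^{2}\right).
% Taking \delta = 0 recovers the usual subgaussian condition; the paper's
% characterisation says the extra e^{\delta} slack accounts precisely for a
% nonzero mean, i.e. a \delta-subgaussian variable is a mean-shifted
% subgaussian variable (for appropriate parameters).
```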
A New View on Worst-Case to Average-Case Reductions for NP Problems
We study the result by Bogdanov and Trevisan (FOCS, 2003), who show that
under reasonable assumptions, there is no non-adaptive worst-case to
average-case reduction that bases the average-case hardness of an NP-problem on
the worst-case complexity of an NP-complete problem. We replace the hiding and
the heavy-samples protocols in [BT03] by employing the histogram verification
protocol of Haitner, Mahmoody and Xiao (CCC, 2010), which proves to be very
useful in this context. Once the histogram is verified, our hiding protocol is
directly public-coin, whereas the intuition behind the original protocol
inherently relies on private coins.
Symbolic Encryption with Pseudorandom Keys
We give an efficient decision procedure that, on input two (acyclic)
cryptographic expressions making arbitrary use of an encryption scheme
and a (length doubling) pseudorandom generator, determines (in polynomial time) if the two expressions produce computationally indistinguishable distributions for any pseudorandom generator and encryption scheme satisfying the standard security notions of pseudorandomness and indistinguishability under chosen plaintext attack.
The procedure works by mapping each expression to a symbolic pattern that captures, in a fully abstract way, the information revealed by the expression to a computationally bounded observer. We then prove that if any two (possibly cyclic) expressions are mapped to the same
pattern, then the associated distributions are indistinguishable.
At the same time, if the expressions are mapped to different symbolic
patterns and do not contain encryption cycles, there are secure
pseudorandom generators and encryption schemes for which the two
distributions can be distinguished with overwhelming advantage.
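To illustrate the flavour of such symbolic patterns, here is a minimal sketch in Python in the spirit of classic Abadi-Rogaway-style patterns; the tuple encoding and function names are ours, and the sketch deliberately omits the pseudorandom generator and the key-cycle issues that the paper's procedure actually handles. The idea: a ciphertext whose key a bounded observer cannot recover is replaced by an opaque blob.

```python
# Minimal Abadi-Rogaway-style "pattern" computation (illustrative only).
# Expressions are nested tuples: ("key", name), ("msg", name),
# ("pair", a, b), ("enc", keyname, payload).

def keys_recoverable(expr, known):
    """Keys a bounded observer can learn from `expr`, given already-known keys."""
    kind = expr[0]
    if kind == "key":                       # a key sent in the clear is learned
        return {expr[1]}
    if kind == "pair":                      # both components are visible
        return keys_recoverable(expr[1], known) | keys_recoverable(expr[2], known)
    if kind == "enc":                       # payload visible only if key is known
        return keys_recoverable(expr[2], known) if expr[1] in known else set()
    return set()

def all_recoverable_keys(expr):
    """Iterate to a fixpoint: learning one key may unlock further ciphertexts."""
    known = set()
    while True:
        new = known | keys_recoverable(expr, known)
        if new == known:
            return known
        known = new

def pattern(expr, known=None):
    """Replace ciphertexts under unrecoverable keys by an opaque blob."""
    if known is None:
        known = all_recoverable_keys(expr)
    kind = expr[0]
    if kind == "pair":
        return ("pair", pattern(expr[1], known), pattern(expr[2], known))
    if kind == "enc":
        if expr[1] in known:
            return ("enc", expr[1], pattern(expr[2], known))
        return ("blob",)                    # reveals nothing beyond its existence
    return expr

# Example: E_{k1}(k2) paired with E_{k2}(m) reveals neither k2 nor m,
# since k1 is never sent in the clear.
e = ("pair", ("enc", "k1", ("key", "k2")), ("enc", "k2", ("msg", "m")))
print(pattern(e))                           # ('pair', ('blob',), ('blob',))
```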
Security considerations for Galois non-dual RLWE families
We explore further the hardness of the non-dual discrete variant of the
Ring-LWE problem for various number rings, give improved attacks for certain
rings satisfying some additional assumptions, construct a new family of
vulnerable Galois number fields, and apply some number theoretic results on
Gauss sums to deduce the likely failure of these attacks for 2-power cyclotomic
rings and unramified moduli.
Quantum Lightning Never Strikes the Same State Twice
Public key quantum money can be seen as a version of the quantum no-cloning
theorem that holds even when the quantum states can be verified by the
adversary. In this work, we investigate quantum lightning, a formalization of
"collision-free quantum money" defined by Lutomirski et al. [ICS'10], where
no-cloning holds even when the adversary herself generates the quantum state to
be cloned. We then study quantum money and quantum lightning, showing the
following results:
- We demonstrate the usefulness of quantum lightning by showing several
potential applications, ranging from generating random strings with a proof of
entropy to a completely decentralized cryptocurrency without a blockchain,
where transactions are instant and local.
- We give win-win results for quantum money/lightning, showing that either
signatures/hash functions/commitment schemes meet very strong recently proposed
notions of security, or they yield quantum money or lightning.
- We construct quantum lightning under the assumed multi-collision resistance
of random degree-2 systems of polynomials.
- We show that instantiating the quantum money scheme of Aaronson and
Christiano [STOC'12] with indistinguishability obfuscation that is secure
against quantum computers yields a secure quantum money scheme.
An Equivalence Between Attribute-Based Signatures and Homomorphic Signatures, and New Constructions for Both
In Attribute-Based Signatures (ABS; first defined by Maji, Prabhakaran and Rosulek, CT-RSA 2011) an authority can generate multiple signing keys, where each key is associated with a constraint f. A key respective to f can sign a message x only if f(x) = 0. The security requirements are unforgeability and key privacy (signatures should not expose the specific signing key used). In Homomorphic Signatures (HS; first defined by Boneh and Freeman, PKC 2011), given a signature for a data-set x, one can evaluate a signature for the pair (f(x), f), for functions f. In context-hiding HS, evaluated signatures do not reveal information about the pre-evaluated signature.
In this work we start by showing that these two notions are in fact equivalent. The first implication of this equivalence is a new lattice-based ABS scheme for polynomial-depth circuits, based on the HS construction of Gorbunov, Vaikuntanathan and Wichs (GVW; STOC 2015).
We then construct a new ABS candidate from a worst-case lattice assumption (SIS), with different parameters. Using our equivalence again, now in the opposite direction, our new ABS implies a new lattice-based HS scheme with a different parameter trade-off compared to the aforementioned GVW.
Sampling the Integers with Low Relative Error
Randomness is an essential part of any secure cryptosystem, but many constructions rely on distributions that are not uniform. This is particularly true for lattice-based cryptosystems, which more often than not make use of discrete Gaussian distributions over the integers. For practical purposes it is crucial to evaluate the impact that approximation errors have on the security of a scheme, to provide the best possible trade-off between security and performance. Recent years have seen surprising results allowing the use of relatively low precision while maintaining high levels of security. A key insight in these results is that sampling a distribution with low relative error can provide very strong security guarantees. Since floating-point numbers provide guarantees on the relative approximation error, they seem a suitable tool in this setting, but it is not obvious which sampling algorithms can actually profit from them. While previous works have shown that inversion sampling can be adapted to provide a low relative error (Pöppelmann et al., CHES 2014; Prest, ASIACRYPT 2017), other works have called into question whether this is possible for other sampling techniques (Zheng et al., ePrint report 2018/309). In this work, we consider all sampling algorithms that are popular in the cryptographic setting and analyze the relationship between floating-point precision and the resulting relative error. We show that all of the algorithms either natively achieve a low relative error or can be adapted to do so.
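As a deliberately naive illustration of inversion sampling with floating-point arithmetic (not the paper's analysis; the parameter sigma, the truncation bound and the helper names below are ours), the following Python sketch tabulates the CDF of a truncated discrete Gaussian over the integers in double precision and samples by binary search.

```python
import bisect
import math
import random

def discrete_gaussian_cdf(sigma, tail=12):
    """Tabulate the CDF of a discrete Gaussian on [-tail*sigma, tail*sigma] in doubles."""
    bound = int(math.ceil(tail * sigma))
    support = list(range(-bound, bound + 1))
    weights = [math.exp(-(x * x) / (2.0 * sigma * sigma)) for x in support]
    total = sum(weights)                  # normalisation constant of the truncated support
    cdf, acc = [], 0.0
    for w in weights:
        acc += w / total
        cdf.append(acc)
    return support, cdf

def sample(support, cdf, rng=random):
    """Inversion sampling: draw u uniform in [0,1) and binary-search the CDF."""
    u = rng.random()
    idx = min(bisect.bisect_right(cdf, u), len(support) - 1)  # clamp against rounding
    return support[idx]

support, cdf = discrete_gaussian_cdf(sigma=3.2)
print([sample(support, cdf) for _ in range(10)])
```

This naive tabulation is exactly the kind of algorithm whose relative error is at stake; the cited prior works show how inversion sampling can be adapted so that the approximation error stays low in the relative sense.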
Gradual sub-lattice reduction and a new complexity for factoring polynomials
We present a lattice algorithm specifically designed for some classical
applications of lattice reduction. The applications are for lattice bases with
a generalized knapsack-type structure, where the target vectors are boundably
short. For such applications, the complexity of the algorithm improves
traditional lattice reduction by replacing some dependence on the bit-length of
the input vectors by some dependence on the bound for the output vectors. If
the bit-length of the target vectors is unrelated to the bit-length of the
input, then our algorithm is only linear in the bit-length of the input
entries, which is an improvement over the quadratic complexity of floating-point
LLL algorithms. To illustrate the usefulness of this algorithm we show that a
direct application to factoring univariate polynomials over the integers leads
to the first complexity bound improvement since 1984. A second application is
algebraic number reconstruction, where a new complexity bound is obtained as
well.
On the semantic security of functional encryption schemes
Functional encryption (FE) is a powerful cryptographic primitive that generalizes many asymmetric encryption systems proposed in recent years. Syntax and security definitions for FE were proposed by Boneh, Sahai, and Waters (BSW) (TCC 2011) and independently by O’Neill (ePrint 2010/556). In this paper we revisit these definitions, identify several shortcomings in them, and propose a new definitional approach that overcomes these limitations. Our definitions display good compositionality properties and allow us to obtain new feasibility and impossibility results for adaptive token-extraction attack scenarios that shed further light on the potential reach of general FE for practical applications.
Solving the Shortest Vector Problem in Lattices Faster Using Quantum Search
By applying Grover's quantum search algorithm to the lattice algorithms of
Micciancio and Voulgaris, Nguyen and Vidick, Wang et al., and Pujol and
Stehlé, we obtain improved asymptotic quantum results for solving the
shortest vector problem. With quantum computers we can provably find a shortest
vector in time $2^{1.799n+o(n)}$, improving upon the classical time
complexities of $2^{2.465n+o(n)}$ of Pujol and Stehlé and $2^{2n+o(n)}$ of
Micciancio and Voulgaris, while heuristically we expect to find a shortest
vector in time $2^{0.286n+o(n)}$, improving upon the classical time complexity
of $2^{0.3836n+o(n)}$ of Wang et al. These quantum complexities will be an
important guide for the selection of parameters for post-quantum cryptosystems
based on the hardness of the shortest vector problem.
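As a back-of-the-envelope sanity check of where the speedup comes from (the precise exponents above also depend on list sizes and the structure of each sieve, which the paper analyses in detail), Grover's algorithm searches an unstructured list quadratically faster than classical scanning:

```latex
% Grover search finds a marked item among N = 2^{c n + o(n)} candidates with
% on the order of \sqrt{N} evaluations, so the inner list-search step costs
\sqrt{N} \;=\; 2^{\,c n / 2 + o(n)}
% quantumly instead of 2^{c n + o(n)} classically; the overall exponents quoted
% above combine this with the cost of building and storing the list.
```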